Local Training and Belief Propagation

Authors

  • Charles Sutton
  • Tom Minka
Abstract

Because maximum-likelihood training is intractable for general factor graphs, an appealing alternative is local training, which approximates the likelihood gradient without performing global propagation on the graph. We discuss two new local training methods: shared-unary piecewise, in which unary factors are shared among every higher-way factor that they neighbor, and the one-step cutout method, which computes exact marginals on overlapping subgraphs. Comparing them to naive piecewise training, we show that just as piecewise training corresponds to using the Bethe pseudomarginals after zero BP iterations, shared-unary piecewise corresponds to the pseudomarginals after one parallel iteration, and the one-step cutout method corresponds to the beliefs after two iterations. We show in simulations that this point of view illuminates the errors made by shared-unary piecewise.
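To make the correspondence concrete, here is a minimal illustrative sketch (not from the paper; the toy potentials, the 3-cycle graph, and the helper name bp_beliefs are assumptions made for illustration) that runs parallel sum-product belief propagation on a small loopy binary model and prints the Bethe node pseudomarginals after 0, 1, and 2 parallel iterations, the quantities the abstract identifies with naive piecewise, shared-unary piecewise, and the one-step cutout method, respectively.

```python
import numpy as np

# Toy loopy model: three binary variables on a cycle, with unary and
# pairwise potentials chosen arbitrarily for illustration.
unary = {0: np.array([1.0, 2.0]),
         1: np.array([3.0, 1.0]),
         2: np.array([1.0, 1.5])}
edges = [(0, 1), (1, 2), (0, 2)]
pairwise = {e: np.array([[2.0, 1.0], [1.0, 2.0]]) for e in edges}

def bp_beliefs(n_iters):
    """Run n_iters of parallel (flooding) sum-product BP and return the
    normalized node pseudomarginals (Bethe beliefs)."""
    # Directed messages m[(i, j)] along each edge, initialized uniformly.
    msgs = {(i, j): np.ones(2) for (i, j) in edges}
    msgs.update({(j, i): np.ones(2) for (i, j) in edges})
    for _ in range(n_iters):
        new = {}
        for (i, j) in msgs:
            # Product of the unary factor at i and all incoming messages
            # except the one coming back from j.
            incoming = unary[i].copy()
            for (a, b) in msgs:
                if b == i and a != j:
                    incoming = incoming * msgs[(a, b)]
            # Orient the pairwise potential so rows index x_i, columns x_j.
            psi = pairwise[(i, j)] if (i, j) in pairwise else pairwise[(j, i)].T
            out = psi.T @ incoming
            new[(i, j)] = out / out.sum()
        msgs = new
    beliefs = {}
    for v in unary:
        bel = unary[v].copy()
        for (a, b) in msgs:
            if b == v:
                bel = bel * msgs[(a, b)]
        beliefs[v] = bel / bel.sum()
    return beliefs

# 0 iterations: beliefs are just the normalized unary factors (naive piecewise);
# 1 iteration: shared-unary piecewise; 2 iterations: one-step cutout.
for k in (0, 1, 2):
    print(f"pseudomarginals after {k} parallel iteration(s): {bp_beliefs(k)}")
```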


Similar Articles

Piecewise Training for Undirected Models

For many large undirected models that arise in real-world applications, exact maximum-likelihood training is intractable, because it requires computing marginal distributions of the model. Conditional training is even more difficult, because the partition function depends not only on the parameters, but also on the observed input, requiring repeated inference over each training example. An appea...


Human Pose Estimation in Vision Networks Via Distributed Local Processing and Nonparametric Belief Propagation

In this paper we propose a self-initialized method for human pose estimation from multiple cameras. A graphical model for the articulated body is defined through explicit kinematic and structural constraints, which allows for any plausible body configuration and avoids learning the joint distributions from training data. Nonparametric belief propagation (NBP) is used to infer the marginal distr...


MapReduce Lifting for Belief Propagation

Judging by the increasing impact of machine learning on large-scale data analysis in the last decade, one can anticipate a substantial growth in diversity of the machine learning applications for “big data” over the next decade. This exciting new opportunity, however, also raises many challenges. One of them is scaling inference within and training of graphical models. Typical ways to address t...


On Loopy Belief Propagation - Local Stability Analysis for Non-Vanishing Fields

In this work we obtain all fixed points of belief propagation and perform a local stability analysis. We consider pairwise interactions of binary random variables and investigate the influence of non-vanishing fields and finite-size graphs on the performance of belief propagation; local stability is heavily influenced by these properties. We show why non-vanishing fields help to achieve converg...


Learning Deep Inference Machines

Introduction. The traditional approach to structured prediction problems is to craft a graphical model structure, learn parameters for the model, and perform inference using an efficient (and usually approximate) inference approach, including, e.g., graph cut methods, belief propagation, and variational methods. Unfortunately, while remarkably powerful methods for inference have been developed ...



Publication year: 2006